Amazon Echo smart speakers (Amazon)

Amazon announces shift of Alexa work from Nvidia to its own Inferentia chips

Amazon has announced that it will move some of the computing work behind its voice assistant Alexa to its own custom chips, according to a report from Financial Express. The company will use its own Inferentia chips instead of the Nvidia chips it has relied on so far.

The move is bad news for Nvidia, but for Amazon it should make Alexa's processing cheaper and faster.

As things work today, when an Echo user asks Alexa a question, the request is sent to one of Amazon's data centers, where it is processed in several stages. Once the system has computed an answer, the response is converted into audible speech so that the Echo can reply to the user.
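
To make the multi-stage flow concrete, here is a minimal Python sketch of that pipeline: the audio is transcribed, an answer is computed, and the answer is turned back into audio. All function bodies are hypothetical stand-ins for illustration only, not Amazon's actual implementation.

```python
# Illustrative sketch of the Alexa request pipeline described above.
# Every function body is a placeholder, not Amazon's real code.

def transcribe(audio: bytes) -> str:
    """Speech-to-text stage (a real system would run an ASR model)."""
    return "what is the weather today"

def answer(question: str) -> str:
    """Response stage (a real system would run NLU and retrieval)."""
    return "It is sunny and 22 degrees."

def synthesize_speech(text: str) -> bytes:
    """Text-to-speech stage (a real system would run a TTS model)."""
    return text.encode("utf-8")

def handle_alexa_request(audio: bytes) -> bytes:
    """Run one Echo request through the three stages in order."""
    question = transcribe(audio)
    reply = answer(question)
    return synthesize_speech(reply)

if __name__ == "__main__":
    print(handle_alexa_request(b"<audio bytes>"))
```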

Until now, Amazon performed this processing on Nvidia chips. Going forward, most of that work will be redirected to Amazon's own custom chip, Inferentia. First announced in 2018, Inferentia was designed specifically to accelerate machine learning inference tasks such as image recognition and high-volume text-to-speech.
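
For context, models are typically compiled for Inferentia with AWS's Neuron SDK before deployment. The sketch below assumes the torch-neuron package for PyTorch on Inf1 instances; exact package and API names vary by SDK version, so treat it as an assumption rather than a reference, and it says nothing about how Amazon deploys Alexa internally.

```python
# Hedged sketch: compiling a PyTorch image-recognition model for Inferentia
# with the AWS Neuron SDK (assumes the `torch-neuron` package for Inf1;
# API details may differ across SDK versions).
import torch
import torch_neuron          # provided by the AWS Neuron SDK (assumption)
import torchvision.models as models

model = models.resnet50(pretrained=True).eval()
example = torch.zeros(1, 3, 224, 224)        # dummy input for tracing

# Trace/compile the model so it can run on Inferentia NeuronCores.
model_neuron = torch.neuron.trace(model, example_inputs=[example])
model_neuron.save("resnet50_neuron.pt")      # artifact deployable on Inf1
```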

Big tech companies in the cloud computing business, such as Microsoft, Google, and Amazon, are among the biggest buyers of computer chips, which has long worked to the advantage of suppliers like Intel and Nvidia.

However, now many companies are moving away from third-party chipmakers and turning to their own in-house solutions.

Shortly before Amazon's announcement, Apple introduced its own Apple Silicon M1 chip and unveiled three new Macs that ship with the M1 instead of the Intel chips Apple has used until now.

Amazon justified its decision by saying that moving some of the work to Inferentia delivered a 25% improvement in latency and a 30% reduction in cost.
